On Tridiagonalizing and Diagonalizing Symmetric Matrices with Repeated Eigenvalues
Authors
Abstract
We describe a divide-and-conquer tridiagonalization approach for matrices with repeated eigenvalues. Our algorithm hinges on the fact that, under easily constructively verifiable conditions, a symmetric matrix with bandwidth b and k distinct eigenvalues must be block diagonal with diagonal blocks of size at most bk. A slight modification of the usual orthogonal band-reduction algorithm allows us to reveal this structure, which then leads to potential parallelism in the form of independent diagonal blocks. Compared with the usual Householder reduction algorithm, the new approach exhibits improved data locality, significantly more scope for parallelism, and the potential to reduce arithmetic complexity by close to 50% for matrices that have only two numerically distinct eigenvalues. The actual improvement depends to a large extent on the number of distinct eigenvalues and on a good estimate thereof. However, at worst the algorithm behaves like a successive band-reduction approach to tridiagonalization. Moreover, we provide a numerically reliable and effective algorithm for computing the eigenvalue decomposition of a symmetric matrix with two numerically distinct eigenvalues. Such matrices arise, for example, in invariant subspace decomposition approaches to the symmetric eigenvalue problem.
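As a concrete illustration of the structural claim above (a sketch only, not the paper's modified band-reduction algorithm), the Python snippet below builds a symmetric matrix with k = 2 numerically distinct eigenvalues, reduces it to tridiagonal form (bandwidth b = 1) with SciPy's Householder-based hessenberg routine, and checks that tiny subdiagonal entries split the result into diagonal blocks of size at most bk = 2. The size, multiplicities, tolerance, and random seed are arbitrary choices for the demonstration.

import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(0)
n = 12

# Two numerically distinct eigenvalues: +1 (multiplicity 7) and -1 (multiplicity 5).
d = np.concatenate([np.ones(7), -np.ones(5)])
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = (Q * d) @ Q.T            # A = Q diag(d) Q^T
A = 0.5 * (A + A.T)          # enforce exact symmetry

# Householder reduction to Hessenberg form; for symmetric input this is
# (numerically) tridiagonal, i.e. bandwidth b = 1.
T = hessenberg(A)
sub = np.abs(np.diag(T, -1))
print("subdiagonal magnitudes:", np.array2string(sub, precision=2))

# Subdiagonal entries below a modest tolerance mark where the tridiagonal
# matrix decouples; with k = 2 distinct eigenvalues the resulting diagonal
# blocks should have size at most b*k = 2.
tol = 1e-10 * np.linalg.norm(A)
sizes, size = [], 1
for beta in sub:
    if beta < tol:
        sizes.append(size)
        size = 1
    else:
        size += 1
sizes.append(size)
print("numerical diagonal block sizes:", sizes)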
Similar resources
Block Lanczos Tridiagonalization of Complex Symmetric Matrices
The classic Lanczos method is an effective method for tridiagonalizing real symmetric matrices. Its block algorithm can significantly improve performance by exploiting memory hierarchies. In this paper, we present a block Lanczos method for tridiagonalizing complex symmetric matrices. Also, we propose a novel componentwise technique for detecting the loss of orthogonality to stabilize the block ...
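For context, here is a minimal unblocked sketch of Lanczos tridiagonalization for a complex symmetric (non-Hermitian) matrix, using the unconjugated bilinear form x^T y; it is not the block method of the paper, and instead of the proposed componentwise test it simply re-orthogonalizes against all previous vectors, which is adequate only for a small demonstration. Breakdown (w^T w ≈ 0) is possible in principle and is not handled here.

import numpy as np

rng = np.random.default_rng(1)
n = 8
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = A + A.T                          # complex symmetric (A = A^T), not Hermitian

V = np.zeros((n, n), dtype=complex)
alpha = np.zeros(n, dtype=complex)
beta = np.zeros(n - 1, dtype=complex)

v = rng.standard_normal(n).astype(complex)
V[:, 0] = v / np.sqrt(v @ v)         # normalize in the bilinear form x^T x
for j in range(n):
    w = A @ V[:, j]
    alpha[j] = V[:, j] @ w           # unconjugated inner product
    w = w - alpha[j] * V[:, j]
    if j > 0:
        w = w - beta[j - 1] * V[:, j - 1]
    # brute-force re-orthogonalization (w.r.t. the bilinear form) for stability
    w = w - V[:, : j + 1] @ (V[:, : j + 1].T @ w)
    if j < n - 1:
        beta[j] = np.sqrt(w @ w)     # a near-zero value here would signal breakdown
        V[:, j + 1] = w / beta[j]

T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
print("||A V - V T|| =", np.linalg.norm(A @ V - V @ T))
print("||V^T V - I|| =", np.linalg.norm(V.T @ V - np.eye(n)))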
II.G Gaussian Integrals
It can be reduced to a product of N one-dimensional integrals by diagonalizing the matrix K ≡ K_{i,j}. Since we need only consider symmetric matrices (K_{i,j} = K_{j,i}), the eigenvalues are real, and the eigenvectors can be made orthonormal. Let us denote the eigenvectors and eigenvalues of K by q̂ and K_q respectively, i.e. K q̂ = K_q q̂. The vectors {q̂} form a new coordinate basis in the original N dimensi...
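As a quick numerical check of this reduction (with N = 2 and an arbitrarily chosen positive definite K), direct two-dimensional quadrature of exp(-x^T K x / 2) agrees with the product over the eigenvalues K_q of the one-dimensional Gaussian integrals sqrt(2π/K_q):

import numpy as np
from scipy.integrate import dblquad

K = np.array([[2.0, 0.7],
              [0.7, 1.5]])           # symmetric, positive definite
K_q = np.linalg.eigvalsh(K)          # eigenvalues of K

# Direct 2-D quadrature over a box wide enough to capture the Gaussian tails.
f = lambda y, x: np.exp(-0.5 * np.array([x, y]) @ K @ np.array([x, y]))
direct, _ = dblquad(f, -10, 10, lambda x: -10, lambda x: 10)

# Product of one-dimensional Gaussian integrals in the eigenbasis of K.
factored = np.prod(np.sqrt(2 * np.pi / K_q))

print(direct, factored)              # the two values agree to quadrature accuracy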
An Explicit Formula for Lanczos Polynomials*
The Lanczos algorithm for tridiagonalizing a given matrix A generates a sequence of approximating matrices that can naturally be obtained as restrictions to subspaces. The eigenvalues of these approximating matrices are well known to be good approximations to the extreme eigenvalues of A. In this paper we produce explicit formulas for the characteristic polynomials of these approximating matrices, in terms of the...
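The snippet below illustrates only the convergence statement quoted above, not the explicit characteristic-polynomial formulas the paper derives: a plain Lanczos iteration with full reorthogonalization (sizes, spectrum, and seed chosen arbitrarily) produces tridiagonal approximating matrices whose extreme eigenvalues, the Ritz values, land near the extreme eigenvalues of A after relatively few steps.

import numpy as np

rng = np.random.default_rng(2)
n, m = 200, 25
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = (Q * np.linspace(1.0, 100.0, n)) @ Q.T       # spectrum of A set to [1, 100]

V = np.zeros((n, m + 1))
alpha, beta = np.zeros(m), np.zeros(m)
V[:, 0] = rng.standard_normal(n)
V[:, 0] /= np.linalg.norm(V[:, 0])
for j in range(m):
    w = A @ V[:, j]
    alpha[j] = V[:, j] @ w
    w -= V[:, : j + 1] @ (V[:, : j + 1].T @ w)   # full reorthogonalization
    beta[j] = np.linalg.norm(w)
    V[:, j + 1] = w / beta[j]

# The m x m approximating (tridiagonal) matrix and its eigenvalues (Ritz values).
T = np.diag(alpha) + np.diag(beta[: m - 1], 1) + np.diag(beta[: m - 1], -1)
ritz = np.linalg.eigvalsh(T)
print("extreme Ritz values:", ritz[0], ritz[-1])  # approaching the true extremes 1 and 100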
A mathematically simple method based on definition for computing eigenvalues, generalized eigenvalues and quadratic eigenvalues of matrices
In this paper, a fundamentally new method, based on the definition, is introduced for numerical computation of eigenvalues, generalized eigenvalues and quadratic eigenvalues of matrices. Some examples are provided to show the accuracy and reliability of the proposed method. It is shown that the proposed method gives sequences other than those of existing methods, but they are still convergent to th...
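The abstract does not spell out the proposed method, so the sketch below is not an implementation of it; it only restates the defining equation involved in the quadratic case, det(λ²M + λC + K) = 0, solves it by the standard companion linearization, and verifies each computed λ against that definition. The matrices and sizes are arbitrary.

import numpy as np

rng = np.random.default_rng(3)
n = 4
M, C, K = (rng.standard_normal((n, n)) for _ in range(3))

# Companion linearization of the quadratic eigenvalue problem:
# [-C -K; I 0] z = lambda [M 0; 0 I] z, with z = [lambda*x; x].
A = np.block([[-C, -K], [np.eye(n), np.zeros((n, n))]])
B = np.block([[M, np.zeros((n, n))], [np.zeros((n, n)), np.eye(n)]])
lams = np.linalg.eigvals(np.linalg.solve(B, A))

# By definition, each lambda makes the quadratic matrix polynomial singular.
for lam in lams:
    sigma_min = np.linalg.svd(lam**2 * M + lam * C + K, compute_uv=False)[-1]
    print(f"lambda = {lam:.4f}, smallest singular value = {sigma_min:.2e}")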
Journal: SIAM J. Matrix Analysis Applications
Volume 17, Issue -
Pages -
Publication date: 1996